Search Results: "bremner"

28 August 2011

Cyril Brulebois: Everybody be cool

this is not a robbery, only a regular X server branch switch, from 1.10 to 1.11. That means X drivers (for input and video) need to be rebuilt against the new server, which is happening through binNMUs. It also means the X stack from sid might be uninstallable for a few hours, but impatient people can still use the stack from wheezy in the meanwhile. Please refrain from reporting uninstallability bugs: that's expected, can't be avoided, and will only last a few hours (assuming no driver starts to Fail To Build From Source). Many thanks to the following people, who joined the "let's do a pre-upload crash test" effort:

14 August 2011

Antonio Terceiro: Handling upstream patches with git-export-debian-patches

These days I briefly discussed with a fellow Debian developer how to maintain upstream patches in Debian packages with Git, which brought me to rethink my current practices a little. What I usually do is pretty much like point 4 in Raphael's post "4 tips to maintain a 3.0 (quilt) Debian source package in a VCS": I make commits in the Debian packaging branch, or in a separate branch that is merged into the Debian packaging branch. Then I add the single-debian-patch option to debian/source/options so that a single Debian patch is generated, and include a patch header that points people interested in the individual changes to the public Git repository where they were originally made. My reasoning for doing so was the following: most upstream developers will hardly care enough to come check the patches applied against their source in Debian, so it's not so important to have a clean source package with separated and explained patches. But then there are the people who will actually care about the patches: other distribution developers. Not imposing a specific VCS on them to review the patches applied in Debian is a nice thing to do. So I wrote a script called git-export-debian-patches (download, manpage), partly inspired by David Bremner's script. It exports to debian/patches all commits in the Debian packaging branch that do not touch files under debian/ and were not applied upstream. The script also creates an appropriate debian/patches/series file, and is even smart enough to detect patches that were later reverted in the Debian branch and exclude them (along with the commits that reverted them) from the patch list. The advantage I see over gbp-pq is that I don't need to rebase (and thus lose history) to have a clean set of patches.
The advantage over the gitpkg quilt-patches-deb-export-hook is that I don't need to explicitly say which ranges I want: every change that is merged in master, was not applied upstream, and was not reverted gets listed as a patch. To be honest, I don't have any experience with either gbp-pq or gitpkg, and these advantages are based on what I read, so please leave a (nice ;-)) comment if I said something stupid. I am looking forward to receiving feedback about the tool, especially about potential corner cases in which it would break. For now I have tested it on a package with simple changes against the upstream source, and it seems fine.
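The core idea can be sketched with plain git commands. This is a rough approximation of what git-export-debian-patches automates, not the script itself; the throwaway repo, branch names, and file names are invented for illustration, and the real script additionally handles reverts and patches already applied upstream.

```shell
# Build a toy repo: one upstream commit, then a packaging commit
# (touching only debian/) and an upstream fix on the packaging branch.
set -e
dir=$(mktemp -d) && cd "$dir"
git init -q demo && cd demo
git config user.email demo@example.com && git config user.name Demo
echo v1 > code.c && git add code.c && git commit -qm 'upstream release'
git branch upstream
mkdir debian && echo 9 > debian/compat
git add debian && git commit -qm 'add packaging'
echo v2 > code.c && git commit -qam 'fix a bug in code.c'

# Export every commit since "upstream" that touches something outside
# debian/ as a quilt patch, and record it in debian/patches/series.
# The ':!debian' pathspec magic excludes commits touching only debian/.
mkdir -p debian/patches
git rev-list --reverse --no-merges upstream..HEAD -- . ':!debian' |
while read sha; do
    git format-patch -1 -o debian/patches "$sha"
done | sed 's|^debian/patches/||' > debian/patches/series
cat debian/patches/series   # only the code fix is listed, not the packaging commit
```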

13 March 2011

David Bremner: Debugging Duplicity Duplication

It seems kind of unfair, given the name, but duplicity really doesn't like to be run in parallel. This means that some naive admin (not me of course, but uh, this guy I know ;) ) who writes a crontab
 @daily  duplicity incr $ARGS $SRC $DEST
 @weekly duplicity full $ARGS $SRC $DEST 
is in for a nasty surprise when both fire at the same time. In particular, one of them will terminate with the not very helpful:
 AttributeError: BackupChain instance has no attribute 'archive_dir'
After some preliminary reading of mailing list archives, I decided to delete ~/.cache/duplicity on the client and try again. This was not a good move.
  1. It didn't fix the problem
  2. Resyncing from the server required decrypting some information, which required access to the gpg private key.
Now for me, one of the main motivations for using duplicity was that I could encrypt to a key without having the private key accessible. Luckily the following crazy hack works.
  1. On a host where the gpg private key is accessible, delete ~/.cache/duplicity and perform some arbitrary duplicity operation. I did duplicity clean $DEST
  2. Now rsync the ~/.cache/duplicity directory to the backup client.
Now at first you will be depressed, because the problem isn't fixed yet. What you need to do is go onto the backup server (in my case Amazon S3) and delete one of the backups (in my case, the incremental one). Of course, if you are the kind of reader who skips to the end, probably just doing this will fix the problem and you can avoid the hijinks. And, uh, some kind of locking would probably be a good plan... For now I just stagger the cron jobs.
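For the locking, something like flock(1) from util-linux would do. This is an untested sketch: the wrapper name and lock path are my own invention, and $ARGS/$SRC/$DEST are as in the crontab above.

```shell
# Run a command under an exclusive, non-blocking lock: if another
# duplicity invocation already holds the lock, fail fast instead of
# letting two concurrent runs corrupt each other's state.
run_locked () {
    flock -n "${LOCKFILE:-/tmp/duplicity.lock}" "$@"
}

# In the crontab this would become, e.g.:
#   @daily  run_locked duplicity incr $ARGS $SRC $DEST
#   @weekly run_locked duplicity full $ARGS $SRC $DEST
run_locked echo "lock acquired"
```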

30 January 2011

David Bremner: Yet another git+quilt Debian packaging workflow

As of version 0.17, gitpkg ships with a hook called quilt-patches-deb-export-hook. This can be used to export patches from git at the time of creating the source package. This is controlled by a file debian/source/git-patches. Each line contains a range suitable for passing to git-format-patch(1). The variables UPSTREAM_VERSION and DEB_VERSION are replaced with values taken from debian/changelog. Note that $UPSTREAM_VERSION is the first part of $DEB_VERSION. An example is
 upstream/$UPSTREAM_VERSION..patches/$DEB_VERSION
 upstream/$UPSTREAM_VERSION..embedded-libs/$DEB_VERSION
This tells gitpkg to export the given two ranges of commits to debian/patches while generating the source package. Each commit becomes a patch in debian/patches, with names generated from the commit messages. In this example, we get 5 patches from the two ranges.
 0001-expand-pattern-in-no-java-rule.patch
 0002-fix-dd_free_global_constants.patch
 0003-Backported-patch-for-CPlusPlus-name-mangling-guesser.patch
 0004-Use-system-copy-of-nauty-in-apps-graph.patch
 0005-Comment-out-jreality-installation.patch
Thanks to the wonders of 3.0 (quilt) packages, these are applied when the source package is unpacked. Caveats.

8 January 2011

David Bremner: Beamer overlays in highlighted source code

I use a lot of code in my lectures, in many different programming languages. I use highlight to generate HTML (via ikiwiki) for web pages. For class presentations, I mostly use the beamer LaTeX class. In order to simplify generating overlays, I wrote a perl script hl-beamer.pl to preprocess source code. An htmlification of the documentation/man page follows.
NAME
 hl-beamer - Preprocessor for highlight to generate beamer overlays.
SYNOPSIS
 hl-beamer -c // InstructiveExample.java | highlight -S java -O latex > figure1.tex
DESCRIPTION
 hl-beamer looks for single-line comments (with syntax specified by -c). These comments can start with @ followed by some codes to specify beamer overlays or sections (just chunks of text which can be selectively included).
OPTIONS
CODES
EXAMPLE
 Example input follows. I would probably process this with
hl-beamer -s 4 -k encodeInner
Sample Input
 // @( omit
 import java.io.BufferedReader;
 import java.io.FileReader;
 import java.io.IOException;
 import java.io.Serializable;
 import java.util.Scanner;
 // @)
     // @( encoderInner
     private int findRun(int inRow, int startCol) {
         // @<
         int value=bits[inRow][startCol];
         int cursor=startCol;
         // @>
         // @<
         while(cursor<columns && 
               bits[inRow][cursor] == value) {
             //@<
             cursor++;
             //@>
         }
         // @>
         // @<
         return cursor-1;
         // @>
     }
     // @)
BUGS AND LIMITATIONS Currently overlaytype and section must consist of upper and lower case letters and or underscores. This is basically pure sloth on the part of the author. Tabs are always expanded to spaces.

17 December 2010

Pietro Abate: debian git packaging with git upstream

Update: there is an easier method to do all this using gbp-clone, as described here. Then, to build the package, you just need to tell git-buildpackage where to find the pristine-tar branch:
git-buildpackage --git-upstream-branch=upstream/master
or you could simply describe (as suggested) the layout in debian/gbp.conf. Easy !!!
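For reference, a minimal debian/gbp.conf describing this layout might look like the following. The option names are as documented by git-buildpackage; the branch names are assumptions based on the setup described below.

```
[DEFAULT]
upstream-branch = upstream/master
debian-branch = master
pristine-tar = True
```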
I've found a lot of different recipes and howtos about git Debian packaging, but I failed to find one simple recipe to create a Debian package from scratch when upstream is using git. Of course, the following is a big patchwork from many different sources. First we need to do a bit of administrative work to set up the repository:
mkdir yourpackage
cd yourpackage
git init --shared
Then, since I'm interested in tracking the upstream development branch, I'm going to add a remote for it to my repo:
git remote add upstream git://the.url/here.git
at this point I need to fetch upstream and create a branch for it.
git fetch upstream
git checkout -b upstream upstream/master
Now in my repo I have a master branch and an upstream branch. So far, so good. Let's add the debian branch based on master:
git checkout master
git checkout -b debian master
It's in the debian branch that I'm going to keep the Debian-related files. I'm finally ready for hacking: git add / git commit / git rm ... When I'm done, I can switch to master, merge the debian branch into it, and use git-buildpackage to build the package.
git checkout master
git branch
debian
* master
upstream

git merge debian
git-buildpackage
Suppose I want to put everything on Gitorious, for example. I'll create an account, set up my ssh public key, and then I have to add an origin ref in my .git/config. Something like:
[remote "origin"]
url = git@gitorious.org:debian-stuff/yourpackage.git
fetch = +refs/heads/*:refs/remotes/origin/*
[branch "master"]
remote = origin
merge = refs/heads/master
The only thing left to do is to push everything to Gitorious; the --all flag is important.
git push --all
People willing to pull your work from Gitorious can follow this script:
$git clone git@gitorious.org:debian-stuff/yourpackage.git
$cd yourpackage
$git branch -a
* master
remotes/origin/HEAD -> origin/master
remotes/origin/debian
remotes/origin/master
remotes/origin/upstream
$git checkout -t origin/debian
$git checkout -t origin/upstream
$git branch -a
debian
master
* upstream
remotes/origin/HEAD -> origin/master
remotes/origin/debian
remotes/origin/master
remotes/origin/upstream
$git checkout master
$git-buildpackage
Maybe there is an easier way to pull all remote branches at once, but I'm not aware of it. Any better way ?

11 December 2010

David Bremner: Which git commits should I send to upstream?

I recently decided to try maintaining a Debian package (bibutils) without committing any patches to Git. One of the disadvantages of this approach is that the patches for upstream are not nicely sorted out in ./debian/patches. I decided to write a little tool to sort out which commits should be sent to upstream. I'm not too happy about the length of it, or the name "git-classify", but I'm posting in case someone has some suggestions. Or maybe somebody finds this useful.
#!/usr/bin/perl
use strict;
my $upstreamonly=0;
if ($ARGV[0] eq "-u") {
  $upstreamonly=1;
  shift(@ARGV);
}
open(GIT,"git log -z --format=\"%n%x00%H\" --name-only @ARGV |");
# throw away blank line at the beginning.
$_=<GIT>;
my $sha="";
LINE: while(<GIT>) {
  chomp();
  next LINE if (m/^\s*$/);
  if (m/^\x00([0-9a-fA-F]+)/) {
    $sha=$1;
  } else {
    my $debian=0;
    my $upstream=0;
    foreach my $word ( split("\x00",$_) ) {
      if ($word=~m@^debian/@) {
        $debian++;
      } elsif (length($word)>0) {
        $upstream++;
      }
    }
    if (!$upstreamonly) {
      print "$sha\t";
      print "MIXED" if ($upstream>0 && $debian>0);
      print "upstream" if ($upstream>0 && $debian==0);
      print "debian" if ($upstream==0 && $debian>0);
      print "\n";
    } else {
      print "$sha\n" if ($upstream>0 && $debian==0);
    }
  }
}
=pod

=head1 Name

git-classify - Classify commits as upstream, debian, or MIXED

=head1 Synopsis

=over

=item B<git classify> [I<-u>] [I<arguments for git-log>]

=back

=head1 Description

Classify a range of commits (specified as for git-log) as I<upstream>
(touching only files outside ./debian), I<debian> (touching only files
inside ./debian) or I<MIXED>. Presumably this last kind is to be
discouraged.

=head2 Options

=over

=item B<-u> output only the SHA1 hashes of upstream commits (as
defined above).

=back

=head1 Examples

Generate all likely patches to send upstream

     git classify -u $SHA..HEAD | xargs -L1 git format-patch -1

David Bremner: Converting META.yml to META.json

Before I discovered you could just point your browser at http://search.cpan.org/meta/Dist-Name-0.007/META.json to automagically convert META.yml to META.json, I wrote a script to do it.
Anyway, it goes with my "I hate the cloud" prejudices :).
use CPAN::Meta;
use CPAN::Meta::Converter;
use Data::Dumper;
my $meta = CPAN::Meta->load_file("META.yml");
my $cmc = CPAN::Meta::Converter->new($meta);
my $new=CPAN::Meta->new($cmc->convert(version=>"2"));
$new->save("META.json");

3 December 2010

Debian News: New Debian Developers (November 2010)

The following developers got their Debian accounts in the last month: Congratulations!

31 October 2010

David Bremner: Extracting text from pdf with pdfedit

It turns out that pdfedit is pretty good at extracting text from pdf files. Here is a script I wrote to do that in batch mode.
#!/bin/sh
# Print the text from a pdf document on stdout
# Copyright: (c) 2006-2010 PDFedit team  <http://sourceforge.net/projects/pdfedit>
# Copyright: (c) 2010, David Bremner <david@tethera.net>
# Licensed under version 2 or later of the GNU GPL
set -e
if [ $# -lt 1 ]; then
    echo usage: $0 file [pageSep]
    exit 1
fi
/usr/bin/pdfedit -console -eval '
function onConsoleStart() {
    var inName = takeParameter();
    var pageSep = takeParameter();
    var doc = loadPdf(inName,false);

    pages=doc.getPageCount();
    for (i=1;i<=pages;i++) {
        pg=doc.getPage(i);
        text=pg.getText();
        print(text);
        print("\n");
        print(pageSep);
    }
}
' $1 $2
Yeah, I wish #!/usr/bin/pdfedit worked too. Thanks to Aaron M Ucko for pointing out that -eval could replace the use of a temporary file. Oh, and pdfedit will be even better when the authors release a new version that fixes truncating wide text.

10 October 2010

Gregor Herrmann: RC bugs 2010/40

a rather active week, although most of my uploads were (again!) either packages prepared by others or packages where I just applied existing patches from the BTS. In case you haven't seen it yet: there will be a BSP at the Mini-DebConf Paris at the end of October.

7 October 2010

David Bremner: Tags are notmuch the point

Dear Julien: After using notmuch for a while, I came to the conclusion that tags are mostly irrelevant. What is a game-changer for me is fast global search. And yes, I switched from using dovecot search, so I mean much faster than that. Actually, I remember from the Human-Computer Interface course that I took in the early Neolithic era that speed of response has been measured as a key factor in interfaces, so maybe it isn't just me. Of course there are tradeoffs, some of which you mention. David

19 August 2010

Jonathan Wiltshire: Batch importing caff signatures

Having swapped details with many, many people at Debconf, and then been away for a week after that, I found myself with an overflowing mailbox and a long task of "open mail, provide pass-phrase, pipe to gpg import". I wanted a way to batch-import all these signatures (there are three times as many, because my key has three UIDs) in one or two goes, and tidy up the stragglers later. David Bremner wrote a small Perl script to do this from an mbox file, but I wanted to work in pure shell and with mutt. Just shoving the mbox at gpg resulted in it decrypting one message, then bailing at the fact that the IDEA plugin is not present. Here was my eventual workflow, which only requires you to provide the pass-phrase once:
  1. create a maildir, either with maildir-make or a directory with cur, new and tmp directories nested inside;
  2. mark all relevant messages as read, and save them here (it doesn't matter if others get caught up in it);
  3. now change to the maildir /cur directory, and run the following bash (disclaimer: totally untested and used at your own risk):

    for a in `ls`; do mv $a $a.gpg; done
    gpg --decrypt-files *.gpg
    rm *.gpg
    gpg --import *
    rm *
I expect there are better/quicker/safer ways to do it, but this worked well for me at midnight on a Monday evening. 19/08/10: Yes, it turns out I am a numpty, and Mutt can handle this all by itself with Ctrl-K and a tagged list. This is still quite handy when the private key is not on the machine you're using to read mail, though. Thanks for the corrections.

12 August 2010

David Bremner: Batch processing mails from caff

What is it? I was a bit daunted by the number of mails from people signing my gpg keys at debconf, so I wrote a script to mass process them. The workflow, for those of you using notmuch is as follows:
$ notmuch show --format=mbox tag:keysign > sigs.mbox
$ ffac sigs.mbox
where previously I have tagged keysigning emails as "keysign" if I want to import them. You also need to run gpg-agent, since I was too lazy/scared to deal with passphrases. This will import them into a keyring in ~/.ffac; uploading is still manual using something like
$ gpg --homedir=$HOME/.ffac --send-keys $keyid 
UPDATE: Before you upload all of those shiny signatures, you might want to use the included script fetch-sig-keys to add the corresponding keys to the temporary keyring in ~/.ffac. After
$ fetch-sig-keys $keyid
then
$ gpg --homedir ~/.ffac --list-sigs $keyid  
should have a UID associated with each signature. How do I use it? At the moment this has been tested once or twice by one person. More testing would be great, but be warned: this is pre-release software until you can install it with apt-get. I have a patched version of the Debian package that I could make available if there is interest.

4 August 2010

Michael Banck: 4 Aug 2010

Science and Math Track at DebConf10 This year's DebConf10 (which is great, by the way) at Columbia University, New York will feature Tracks for the first time. We had a Community Outreach track on Debian Day (to be continued by more awesome talks over the rest of the week), a Java track on Monday and an Enterprise track yesterday. Tomorrow, Thursday afternoon, the Science and Math track (which I am organizing) will take place in the Interschool lab on level 7 of Schapiro Center. The Track will start at 14:00 with a short welcome from me, followed by presentations of debian-science by Sylvestre Ledru and debian-math by David Bremner. At 15:00, Michael Hanke and Yaroslav Halchenko will present their talk on "Debian as the ultimate platform for neuroimaging research". This will be followed at 16:00 by three mini-talks on "New developments in Science Packaging". Adam C. Powell, IV will talk about MPI, Sylvestre Ledru will present linear algebra implementations in Debian and finally Michael Hanke and Yaroslav Halchenko will discuss the citation/reference infrastructure. At the end of the track, the annual debian-science round-table will happen at 17:00, where David Bremner (mathematics), Michael Hanke (neuro-debian), Sylvestre Ledru (debian-science/pkg-scicomp), Adam C. Powell, IV (debian-science/pkg-scicomp) and myself (debichem) will discuss matters about cross-field debian-science and math related topics. If afterwards there are still outstanding matters to be discussed, we can schedule ad-hoc sessions for science or math matters on Friday or Saturday. See you at the science track tomorrow!

24 June 2010

David Bremner: Yet another tale of converting Debian packaging to Git

racket (previously known as plt-scheme) is an interpreter/JIT-compiler/development environment with about 6 years of subversion history in a converted git repo. Debian packaging has been done in subversion, with only the contents of ./debian in version control. I wanted to merge these into a single git repository. The first step is to create a repo and fetch the relevant history.
TMPDIR=/var/tmp
export TMPDIR
ME=`readlink -f $0`
AUTHORS=`dirname $ME`/authors
mkdir racket && cd racket && git init
git remote add racket git://git.racket-lang.org/plt
git fetch --tags racket
git config  merge.renameLimit 10000
git svn init  --stdlayout svn://svn.debian.org/svn/pkg-plt-scheme/plt-scheme/
git svn fetch -A$AUTHORS
git branch debian
A couple points to note: Now a couple complications arose about upstream's git repo.
  1. Upstream releases separate source tarballs for unix, mac, and windows. Each of these is constructed by deleting a large number of files from version control, and occasionally by some last-minute fiddling with README files and so on.
  2. The history of the release tags is not completely linear. For example,
rocinante:~/projects/racket  (git-svn)-[master]-% git diff --shortstat v4.2.4 `git merge-base v4.2.4 v5.0`
 48 files changed, 242 insertions(+), 393 deletions(-)
rocinante:~/projects/racket  (git-svn)-[master]-% git diff --shortstat v4.2.1 `git merge-base v4.2.1 v4.2.4`
 76 files changed, 642 insertions(+), 1485 deletions(-)
This combination caused my straightforward attempt at constructing a history synched with release tarballs to generate many conflicts. I ended up importing each tarball on a temporary branch, and the merges went more smoothly. Note also the use of "git merge -s recursive -X theirs" to resolve conflicts in favour of the new upstream version. The repetitive bits of the merge are collected as shell functions.
import_tgz() {
    if [ -f $1 ]; then
        git clean -fxd;
        git ls-files -z | xargs -0 rm -f;
        tar --strip-components=1 -zxvf $1 ;
        git add -A;
        git commit -m'Importing '`basename $1`;
    else
        echo "missing tarball $1";
    fi;
}
do_merge() {
    version=$1
    git checkout -b v$version-tarball v$version
    import_tgz ../plt-scheme_$version.orig.tar.gz
    git checkout upstream
    git merge -s recursive -X theirs v$version-tarball
}
post_merge() {
    version=$1
    git tag -f upstream/$version
    pristine-tar commit ../plt-scheme_$version.orig.tar.gz
    git branch -d v$version-tarball
}
The entire merge script is here. A typical step looks like
do_merge 5.0
git rm collects/tests/stepper/automatic-tests.ss
git add `git status -s | egrep ^UA | cut -f2 -d' '`
git checkout v5.0-tarball doc/release-notes/teachpack/HISTORY.txt
git rm readme.txt
git add  collects/tests/web-server/info.rkt
git commit -m'Resolve conflicts from new upstream version 5.0'
post_merge 5.0
Finally, we have the comparatively easy task of merging the upstream and Debian branches. In one or two places git was confused by all of the copying and renaming of files and I had to manually fix things up with git rm.
cd racket || /bin/true
set -e
git checkout debian
git tag -f packaging/4.0.1-2 `git svn find-rev r98`
git tag -f packaging/4.2.1-1 `git svn find-rev r113`
git tag -f packaging/4.2.4-2 `git svn find-rev r126`
git branch -f  master upstream/4.0.1
git checkout master
git merge packaging/4.0.1-2
git tag -f debian/4.0.1-2
git merge upstream/4.2.1
git merge packaging/4.2.1-1
git tag -f debian/4.2.1-1
git merge upstream/4.2.4
git merge packaging/4.2.4-2
git rm collects/tests/stxclass/more-tests.ss && git commit -m'fix false rename detection'
git tag -f debian/4.2.4-2
git merge -s recursive -X theirs upstream/5.0
git rm collects/tests/web-server/info.rkt
git commit -m 'Merge upstream 5.0'

30 March 2010

David Bremner: Distributed Issue Tracking with Git

I'm thinking about distributed issue tracking systems that play nice with git. I don't care about other version control systems anymore :). I also prefer command line interfaces, because as commentators on the blog have mentioned, I'm a Luddite (in the imprecise, slang sense). So far I have found a few projects, and tried to guess how much of a going concern they are.
Git Specific
VCS Agnostic
Sortof VCS Agnostic

14 March 2010

David Bremner: Functional programming on the JVM

I'm collecting information (or at least links) about functional programming languages on the JVM. I'm going to intentionally leave "functional programming language" undefined here, so that people can have fun debating :).
Functional Languages
Languages with functional features
Projects and rumours.

6 March 2010

David Bremner: Mirroring a gitolite collection

You have a gitolite install on host $MASTER, and you want a mirror on $SLAVE. Here is one way to do that. $CLIENT is your workstation, which need not be the same as $MASTER or $SLAVE.
  1. On $CLIENT, install gitolite on $SLAVE. It is ok to re-use your gitolite admin key here, but make sure you have both public and private key in .ssh, or confusion ensues. Note that when gitolite asks you to double-check the "host gitolite" ssh stanza, you probably want to change the hostname to $SLAVE, at least temporarily (if not, at least the checkout of the gitolite-admin repo will fail). You may want to copy .gitolite.rc from $MASTER when gitolite fires up an editor.
  2. On $CLIENT, copy the "gitolite" stanza of .ssh/config to a new stanza called e.g. gitolite-slave, then fix the hostname of the gitolite stanza so it points to $MASTER again.
  3. On $MASTER, as the gitolite user, make a passphraseless ssh key. Probably you should call it something like 'mirror'.
  4. Still on $MASTER. Add a stanza like the following to $gitolite_user/.ssh/config
     host gitolite-mirror
       hostname $SLAVE
       identityfile ~/.ssh/mirror
    
    run ssh gitolite-mirror at least once to test and to set up any "known_hosts" file.
  5. On $CLIENT, change directory to a checkout of gitolite-admin from $MASTER. Make sure it is up to date with respect to origin:
    git pull
    
  6. Edit .git/config (or, in very recent git, use git remote set-url --push --add) so that remote origin looks like
    fetch = +refs/heads/*:refs/remotes/origin/*
    url = gitolite:gitolite-admin
    pushurl = gitolite:gitolite-admin
    pushurl = gitolite-slave:gitolite-admin
    
  7. Add a stanza
    repo @all
      RW+     = mirror
    
    to the bottom of your gitolite.conf Add mirror.pub to keydir.
  8. Now overwrite the gitolite-admin repo on $SLAVE git push -f Note that empty repos will be created on $SLAVE for every repo on $MASTER.
  9. Add the following one-line post-update hook to any repos you want mirrored (see the gitolite documentation for how to automate this). You should not modify the post-update hook of the gitolite-admin repo. git push --mirror gitolite-mirror:$GL_REPO.git
  10. Create repos as per normal in the gitolite-admin/conf/gitolite.conf. If you have set the auto post-update hook installation, then each repo will be mirrored. You should only push to $MASTER; any changes pushed to $SLAVE will be overwritten.

13 December 2009

David Bremner: Reading MPS files with glpk

Recently I was asked how to read mps (old school linear programming input) files. I couldn't think of a completely off-the-shelf way to do it, so I wrote a simple C program using the glpk library. Of course, in general you would want to do something other than print it out again.
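For the simple read-and-reprint case, the glpsol driver shipped with GLPK is close to off-the-shelf: it can read an MPS file and dump the problem back out in another format. A sketch (the file names are hypothetical; --mps expects the old fixed-format MPS this post is about, --freemps the free format):

```shell
# Read a fixed-format (old school) MPS file and write it back out as a
# CPLEX LP file without solving it; --check stops after reading the data.
glpsol --mps problem.mps --check --wlp problem.lp
```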
